Asymptotic Theory for Principal Component Analysis

Authors

Abstract


Related Articles

Towards theory of generic Principal Component Analysis

In this paper, we consider a technique called the generic Principal Component Analysis (PCA) which is based on an extension and rigorous justification of the standard PCA. The generic PCA is treated as the best weighted linear estimator of a given rank under the condition that the associated covariance matrix is singular. As a result, the generic PCA is constructed in terms of the pseudo-invers...
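
A minimal numpy sketch of the ingredient this abstract highlights: when the sample covariance matrix is singular, the inverse that appears in the usual PCA/least-squares formulas can be replaced by the Moore-Penrose pseudo-inverse while a rank-r structure is kept. The function name, the unweighted setting, and the toy data below are assumptions for illustration only, not the paper's actual weighted construction.

```python
import numpy as np

def rank_r_pca_with_pinv(samples, r):
    """Rank-r PCA summary of `samples` (one observation per row) built from a
    possibly singular covariance matrix; np.linalg.pinv stands in for the
    inverse.  Illustrative sketch, not the paper's weighted estimator."""
    Xc = samples - samples.mean(axis=0)           # centre the data
    C = Xc.T @ Xc / (len(Xc) - 1)                 # sample covariance, may be singular
    w, V = np.linalg.eigh(C)                      # eigen-decomposition (ascending order)
    top = V[:, np.argsort(w)[::-1][:r]]           # leading r eigenvectors
    projector = top @ top.T                       # rank-r projector onto them
    C_pinv = np.linalg.pinv(C)                    # pseudo-inverse copes with rank deficiency
    return projector, C_pinv

# Toy data in R^5 that lies on a 2-dimensional subspace, so the covariance
# matrix is singular and a plain inverse would be ill-defined.
rng = np.random.default_rng(0)
B = rng.normal(size=(5, 2))
data = rng.normal(size=(200, 2)) @ B.T
P, C_pinv = rank_r_pca_with_pinv(data, r=2)
print(np.linalg.matrix_rank(C_pinv))              # 2: the pseudo-inverse respects the rank
```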


Principal Component Projection Without Principal Component Analysis

We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependen...
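
The core mechanism can be sketched in a few lines of numpy: one ridge-regression call scales the component of a vector along the i-th right singular vector by t_i = s_i^2 / (s_i^2 + lambda), a soft version of the indicator of s_i^2 > lambda, and a couple more calls sharpen those factors with the smooth-step polynomial 3t^2 - 2t^3. The direct ridge solver, the threshold lambda, and the single sharpening step below are illustrative assumptions; the paper's actual algorithm and guarantees are not reproduced.

```python
import numpy as np

def ridge(A, lam, b):
    """Black-box ridge regression: argmin_x ||A x - b||^2 + lam * ||x||^2.
    Solved directly here; any ridge solver could be substituted."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)

def approx_top_pc_projection(A, v, lam):
    """Approximate the projection of v onto the principal components of A whose
    squared singular values exceed lam, using only ridge-regression calls.
    One call scales the i-th component of v by t_i = s_i^2 / (s_i^2 + lam);
    combining three calls as 3 t^2 - 2 t^3 pushes those factors toward 0 or 1."""
    Rv = ridge(A, lam, A @ v)        # factors t_i
    R2v = ridge(A, lam, A @ Rv)      # factors t_i^2
    R3v = ridge(A, lam, A @ R2v)     # factors t_i^3
    return 3.0 * R2v - 2.0 * R3v

# Toy check against the exact projection computed from an explicit SVD.
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.normal(size=(200, 10)))
Q, _ = np.linalg.qr(rng.normal(size=(10, 10)))
s = np.array([5.0, 4.0, 3.0, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0.01])
A = U @ np.diag(s) @ Q                       # right singular vectors are the rows of Q
v = rng.normal(size=10)
exact = Q[:3].T @ (Q[:3] @ v)                # projection onto the top 3 components
approx = approx_top_pc_projection(A, v, lam=1.0)
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))  # modest error; more ridge calls sharpen it
```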


Methodology and Theory for Nonnegative-score Principal Component Analysis

We develop nonparametric methods, and theory, for analysing data on a random p-vector Y represented as a linear form in a p-vector X, say Y = AX, where the components of X are nonnegative and uncorrelated. Problems of this nature are motivated by a wide range of applications in which physical considerations deny the possibility that X can have negative components. Our approach to inference is f...
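
A small simulation may make the setting concrete: generate Y = A X with nonnegative, uncorrelated scores X, observe that ordinary PCA scores of Y are generally not sign-constrained, and, assuming A were known, recover one score vector by nonnegative least squares. The exponential score distribution, the random mixing matrix, and the use of scipy's nnls are illustrative assumptions; the paper's nonparametric inference procedure is not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

# Toy simulation of the model in the abstract, Y = A X, with nonnegative,
# uncorrelated scores X.
rng = np.random.default_rng(0)
p, n = 4, 1000
A = rng.uniform(0.0, 1.0, size=(p, p))          # mixing matrix (unknown in practice)
X = rng.exponential(scale=1.0, size=(p, n))     # nonnegative, independent scores
Y = A @ X                                       # observed p-vectors, one per column

# Ordinary PCA scores of Y are generally not nonnegative, which is the
# difficulty when physical considerations rule out negative components.
Yc = Y - Y.mean(axis=1, keepdims=True)
_, _, Vt = np.linalg.svd(Yc.T, full_matrices=False)   # rows of Vt: principal directions
pca_scores = Vt @ Yc
print("fraction of negative PCA scores:", np.mean(pca_scores < 0))

# If A were known, a single score vector could be recovered under the sign
# constraint by nonnegative least squares:
x_hat, _ = nnls(A, Y[:, 0])
print("true scores:", X[:, 0])
print("nnls estimate:", x_hat)
```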


Compression of Breast Cancer Images By Principal Component Analysis

The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the direction with the maximum variance of X in R^N carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
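
A generic numpy sketch of that idea: flatten image patches into vectors in R^N, form the covariance matrix, keep the k eigenvectors with the largest eigenvalues (the principal components), and store only the k projection coefficients per patch. The synthetic low-rank "patches", the choice k = 10, and the function name are assumptions made for illustration; the paper's breast-cancer imaging pipeline is not reproduced.

```python
import numpy as np

def pca_compress(X, k):
    """Project the rows of X (one flattened image patch per row) onto the k
    eigenvectors of the covariance matrix with the largest eigenvalues, i.e.
    the principal components, and reconstruct from those k coefficients."""
    mean = X.mean(axis=0)
    Xc = X - mean                                 # centre the data
    C = Xc.T @ Xc / (len(Xc) - 1)                 # covariance matrix in R^{N x N}
    w, V = np.linalg.eigh(C)                      # eigenvectors e_i of the covariance
    E = V[:, np.argsort(w)[::-1][:k]]             # k directions of maximum variance
    scores = Xc @ E                               # compressed representation, shape (n, k)
    reconstruction = scores @ E.T + mean          # decompressed approximation of X
    return scores, E, reconstruction

# Toy usage: synthetic 8x8 "patches" (vectors in R^64) with low-rank structure
# plus noise, so that k = 10 components capture almost all of the variance.
rng = np.random.default_rng(0)
patches = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 64))
patches += 0.1 * rng.normal(size=(500, 64))
scores, E, recon = pca_compress(patches, k=10)
print(scores.shape, E.shape)                      # (500, 10) (64, 10)
print("mean squared reconstruction error:", np.mean((patches - recon) ** 2))
```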




Journal

Journal title: The Annals of Mathematical Statistics

Year: 1963

ISSN: 0003-4851

DOI: 10.1214/aoms/1177704248